
    Optimal measurement of visual motion across spatial and temporal scales

    Sensory systems use limited resources to mediate the perception of a great variety of objects and events. Here a normative framework is presented for exploring how the problem of efficient allocation of resources can be solved in visual perception. Starting with a basic property of every measurement, captured by Gabor's uncertainty relation about the location and frequency content of signals, prescriptions are developed for optimal allocation of sensors for reliable perception of visual motion. This study reveals that a large-scale characteristic of human vision (the spatiotemporal contrast sensitivity function) is similar to the optimal prescription, and it suggests that some previously puzzling phenomena of visual sensitivity, adaptation, and perceptual organization have simple principled explanations.
    Comment: 28 pages, 10 figures, 2 appendices; in press in Favorskaya MN and Jain LC (Eds), Computer Vision in Advanced Control Systems using Conventional and Intelligent Paradigms, Intelligent Systems Reference Library, Springer-Verlag, Berlin.
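    As a rough numerical illustration of the Gabor uncertainty relation mentioned above (a sketch of my own, not material from the chapter), the product of a signal's spread in time and in frequency cannot fall below 1/(4*pi), and a Gaussian envelope attains that bound. The snippet below, assuming only NumPy, estimates both spreads for a sampled Gaussian and checks the product against the bound.

```python
import numpy as np

# Illustrative check of Gabor's uncertainty relation: delta_t * delta_f >= 1/(4*pi).
# The Gaussian envelope and its width are arbitrary choices for this sketch.
dt = 1e-3                      # sample spacing (s)
t = np.arange(-5.0, 5.0, dt)   # time axis
sigma = 0.3                    # envelope width (s)
g = np.exp(-t**2 / (2 * sigma**2))

def rms_width(axis, signal):
    """RMS spread of the energy distribution |signal|^2 along the given axis."""
    w = np.abs(signal)**2
    w = w / w.sum()
    mean = (axis * w).sum()
    return np.sqrt((((axis - mean)**2) * w).sum())

G = np.fft.fft(g)
f = np.fft.fftfreq(len(t), d=dt)

delta_t = rms_width(t, g)
delta_f = rms_width(f, G)
print(delta_t * delta_f, 1 / (4 * np.pi))   # both approximately 0.0796
```

    Narrowing the envelope shrinks delta_t but inflates delta_f by the same factor; that trade-off is the measurement constraint the allocation argument in the chapter starts from.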

    Mechanisms of spatiotemporal selectivity in cortical area MT

    Cortical sensory neurons are characterized by selectivity to stimulation. This selectivity was originally viewed as part of the fundamental “receptive field” characteristic of neurons. This view was later challenged by evidence that receptive fields are modulated by stimuli outside of the classical receptive field. Here we show that even this modified view of selectivity needs revision. We measured spatial frequency selectivity of neurons in cortical area MT of alert monkeys and found that their selectivity strongly depends on luminance contrast, shifting to higher spatial frequencies as contrast increases. The changes of preferred spatial frequency are large at low temporal frequencies and decrease monotonically as temporal frequency increases. That is, even interactions among the basic stimulus dimensions of luminance contrast, spatial frequency, and temporal frequency strongly influence neuronal selectivity. This dynamic nature of neuronal selectivity is inconsistent with the notion of stimulus preference as a stable characteristic of cortical neurons.

    Interaction of perceptual grouping and crossmodal temporal capture in tactile apparent-motion

    Previous studies have shown that in tasks requiring participants to report the direction of apparent motion, task-irrelevant mono-beeps can "capture" visual motion perception when the beeps occur temporally close to the visual stimuli. However, the contributions of the relative timing of multimodal events and the event structure, modulating uni- and/or crossmodal perceptual grouping, remain unclear. To examine this question and extend the investigation to the tactile modality, the current experiments presented tactile two-tap apparent-motion streams, with an SOA of 400 ms between successive, left-/right-hand middle-finger taps, accompanied by task-irrelevant, non-spatial auditory stimuli. The streams were presented for 90 seconds, and participants' task was to continuously report the perceived (left- or rightward) direction of tactile motion. In Experiment 1, each tactile stimulus was paired with an auditory beep, though odd-numbered taps were paired with an asynchronous beep, with audiotactile SOAs ranging from -75 ms to 75 ms. Perceived direction of tactile motion varied systematically with audiotactile SOA, indicative of a temporal-capture effect. In Experiment 2, two audiotactile SOAs, one short (75 ms) and one long (325 ms), were compared. The long-SOA condition preserved the crossmodal event structure (so the temporal-capture dynamics should have been similar to those in Experiment 1), but both beeps now occurred temporally close to the taps on one side (even-numbered taps). The two SOAs were found to produce opposite modulations of apparent motion, indicative of an influence of crossmodal grouping. In Experiment 3, only odd-numbered, but not even-numbered, taps were paired with auditory beeps. This abolished the temporal-capture effect and, instead, a dominant percept of apparent motion from the audiotactile side to the tactile-only side was observed independently of the SOA variation. These findings suggest that asymmetric crossmodal grouping leads to an attentional modulation of apparent motion, which inhibits crossmodal temporal-capture effects.

    Continuous Evolution of Statistical Estimators for Optimal Decision-Making

    In many everyday situations, humans must make precise decisions in the presence of uncertain sensory information. For example, when asked to combine information from multiple sources we often assign greater weight to the more reliable information. It has been proposed that the statistical optimality often observed in human perception and decision-making requires that humans have access to the uncertainty of both their senses and their decisions. However, the mechanisms underlying the processes of uncertainty estimation remain largely unexplored. In this paper we introduce a novel visual tracking experiment that requires subjects to continuously report their evolving perception of the mean and uncertainty of noisy visual cues over time. We show that subjects accumulate sensory information over the course of a trial to form a continuous estimate of the mean, hindered only by natural kinematic constraints (e.g., sensorimotor latency). Furthermore, subjects have access to a measure of their continuous objective uncertainty, rapidly acquired from sensory information available within a trial, but limited by natural kinematic constraints and a conservative margin for error. Our results provide the first direct evidence of the continuous mean and uncertainty estimation mechanisms in humans that may underlie optimal decision-making.
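    For context on the "greater weight to the more reliable information" point, here is a minimal sketch of standard inverse-variance (reliability) weighting; the numbers and the function name are illustrative assumptions of mine, not values from the tracking experiment.

```python
import numpy as np

# Minimal sketch of inverse-variance (reliability) weighted cue combination.
# Values are illustrative; nothing here is fitted to the experiment.
def combine(means, variances):
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)   # reliability weights
    combined_mean = float(np.sum(weights * means))
    combined_variance = float(1.0 / np.sum(1.0 / variances))  # never worse than the best cue
    return combined_mean, combined_variance

# The more reliable cue (variance 1.0) dominates the combined estimate.
print(combine([10.0, 14.0], [1.0, 4.0]))   # -> (10.8, 0.8)
```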

    Multisensory Oddity Detection as Bayesian Inference

    A key goal for the perceptual system is to optimally combine information from all the senses that may be available in order to develop the most accurate and unified picture possible of the outside world. The contemporary theoretical framework of ideal observer maximum likelihood integration (MLI) has been highly successful in modelling how the human brain combines information from a variety of different sensory modalities. However, in various recent experiments involving multisensory stimuli of uncertain correspondence, MLI breaks down as a successful model of sensory combination. Within the paradigm of direct stimulus estimation, perceptual models which use Bayesian inference to resolve correspondence have recently been shown to generalize successfully to these cases where MLI fails. This approach has been known variously as model inference, causal inference or structure inference. In this paper, we examine causal uncertainty in another important class of multisensory perception paradigm, oddity detection, and demonstrate how a Bayesian ideal observer likewise treats oddity detection as a structure inference problem. We validate this approach by showing that it provides an intuitive and quantitative explanation of an important pair of multisensory oddity detection experiments, involving cues across and within modalities, for which MLI previously failed dramatically, allowing a novel unifying treatment of within- and cross-modal multisensory perception. Our successful application of structure inference models to the new ‘oddity detection’ paradigm, and the resulting unified explanation of across- and within-modality cases, provide further evidence that structure inference may be a commonly evolved principle for combining perceptual information in the brain.
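    To make the MLI-versus-structure-inference contrast concrete, here is a hedged sketch with my own illustrative parameters and priors (assuming NumPy/SciPy; this is not the authors' model): forced-fusion MLI always averages two cues by reliability, whereas a structure-inference observer first computes the posterior probability that the cues share a common cause.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# Illustrative sketch: two noisy cues x_a, x_v with sensory noise sigma_a, sigma_v
# and a broad Gaussian prior over the underlying source position(s).
sigma_a, sigma_v = 1.0, 2.0     # assumed sensory noise of each cue
sigma_p = 10.0                  # assumed broad prior over source location(s)
prior_common = 0.5              # assumed prior probability of a single common cause

def mli_estimate(x_a, x_v):
    # Forced fusion: reliability-weighted average, regardless of correspondence.
    w = (1 / sigma_a**2) / (1 / sigma_a**2 + 1 / sigma_v**2)
    return w * x_a + (1 - w) * x_v

def posterior_common(x_a, x_v):
    # Marginal likelihood of the two measurements under one common source...
    cov_c1 = np.array([[sigma_a**2 + sigma_p**2, sigma_p**2],
                       [sigma_p**2, sigma_v**2 + sigma_p**2]])
    like_c1 = multivariate_normal.pdf([x_a, x_v], mean=[0, 0], cov=cov_c1)
    # ...and under two independent sources.
    like_c2 = (norm.pdf(x_a, 0, np.sqrt(sigma_a**2 + sigma_p**2)) *
               norm.pdf(x_v, 0, np.sqrt(sigma_v**2 + sigma_p**2)))
    return like_c1 * prior_common / (like_c1 * prior_common + like_c2 * (1 - prior_common))

print(mli_estimate(0.0, 1.0))        # fusion ignores how discrepant the cues are
print(posterior_common(0.0, 1.0))    # small discrepancy -> common cause likely
print(posterior_common(0.0, 15.0))   # large discrepancy -> common cause unlikely
```

    Roughly speaking, the oddity-detection observer applies the same model-comparison step, scoring which candidate structure (common versus separate causes) best explains the stimuli rather than producing a fused estimate.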

    Using curvature information in haptic shape perception of 3D objects

    Are humans able to perceive the circularity of a cylinder that is grasped by the hand? This study presents the findings of an experiment in which cylinders with a circular cross-section had to be distinguished from cylinders with an elliptical cross-section. For comparison, the ability to distinguish a square cuboid from a rectangular cuboid was also investigated. Both elliptical and rectangular shapes can be characterized by the aspect ratio, but elliptical shapes also contain curvature information. We found that an elliptical shape with an aspect ratio of only 1.03 could be distinguished from a circular shape in both static and dynamic touch. However, for a rectangular shape, the aspect ratio needed to be about 1.11 for dynamic touch and 1.15 for static touch in order to be discernible from a square shape. We conclude that curvature information can be employed in a reliable and efficient manner in the perception of 3D shapes by touch.

    Combining eye and hand in search is suboptimal

    When performing everyday tasks, we often move our eyes and hand together: we look where we are reaching in order to better guide the hand. This coordinated pattern with the eye leading the hand is presumably optimal behaviour. But eyes and hands can move to different locations if they are involved in different tasks. To find out whether this leads to optimal performance, we studied the combination of visual and haptic search. We asked ten participants to perform a combined visual and haptic search for a target that was present in both modalities and compared their search times to those on visual-only and haptic-only search tasks. Without distractors, search times were faster for visual search than for haptic search. With many visual distractors, search times were longer for visual than for haptic search. For the combined search, performance was poorer than predicted by the optimal strategy whereby each modality searches a different part of the display. The results are consistent with several alternative accounts, for instance with vision and touch searching independently at the same time.
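    As a back-of-envelope sketch of that "optimal strategy" benchmark (my own simplification, not the authors' analysis): if vision alone would take T_v to search the whole display and touch alone would take T_h, and search time scales with the fraction of the display assigned to a modality, then splitting the display and searching in parallel is fastest when both modalities finish at the same time.

```python
# Hedged sketch with assumed linear scaling of search time with display fraction.
# Vision searches a fraction p in p * T_v; touch searches the rest in (1 - p) * T_h;
# searching in parallel, the trial ends at max(p * T_v, (1 - p) * T_h), which is
# minimised when the two finishing times are equal.
def optimal_split(T_v, T_h):
    p = T_h / (T_v + T_h)                 # fraction of the display assigned to vision
    return p, T_v * T_h / (T_v + T_h)     # optimal fraction and resulting search time

print(optimal_split(6.0, 3.0))  # -> (0.333..., 2.0): faster than either modality alone
```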

    Duration of Coherence Intervals in Electrical Brain Activity in Perceptual Organization

    We investigated the relationship between visual experience and temporal intervals of synchronized brain activity. Using high-density scalp electroencephalography, we examined how synchronized activity depends on visual stimulus information and on individual observer sensitivity. In a perceptual grouping task, we varied the ambiguity of visual stimuli and estimated observer sensitivity to this variation. We found that durations of synchronized activity in the beta frequency band were associated with both stimulus ambiguity and sensitivity: the lower the stimulus ambiguity and the higher the individual observer's sensitivity, the longer the episodes of synchronized activity. Durations of synchronized activity intervals followed an extreme value distribution, indicating that they were limited by the slowest mechanism among the multiple neural mechanisms engaged in the perceptual task. Because the degree of stimulus ambiguity is (inversely) related to the amount of stimulus information, the durations of synchronous episodes reflect the amount of stimulus information processed in the task. We therefore interpreted our results as evidence that the alternating episodes of desynchronized and synchronized electrical brain activity reflect, respectively, the processing of information within local regions and the transfer of information across regions.
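    A toy simulation of the "slowest mechanism" interpretation (the distributions and parameters below are assumptions of mine, not fitted to the EEG data): when an episode ends only once the slowest of several parallel mechanisms finishes, its duration is the maximum of their completion times, and such maxima approach an extreme value (Gumbel-type) distribution.

```python
import numpy as np
from scipy.stats import skew

# Toy simulation: episode duration = maximum completion time across parallel mechanisms.
# Exponential completion times and the parameter values are illustrative assumptions.
rng = np.random.default_rng(0)
n_mechanisms, n_episodes = 20, 100_000
completion_times = rng.exponential(scale=50.0, size=(n_episodes, n_mechanisms))  # ms
episode_durations = completion_times.max(axis=1)

# Maxima of many i.i.d. samples approach a Gumbel distribution, whose skewness is
# fixed at about 1.14; the empirical skew of the simulated durations is comparable.
print(np.mean(episode_durations), skew(episode_durations))
```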

    Audiovisual time perception is spatially specific

    Our sensory systems face a daily barrage of auditory and visual signals whose arrival times form a wide range of audiovisual asynchronies. These temporal relationships constitute an important metric for the nervous system when surmising which signals originate from common external events. Internal consistency is known to be aided by sensory adaptation: repeated exposure to consistent asynchrony brings perceived arrival times closer to simultaneity. However, given the diverse nature of our audiovisual environment, functionally useful adaptation would need to be constrained to signals that were generated together. In the current study, we investigate the role of two potential constraining factors: spatial and contextual correspondence. By employing an experimental design that allows independent control of both factors, we show that observers are able to simultaneously adapt to two opposing temporal relationships, provided they are segregated in space. No such recalibration was observed when spatial segregation was replaced by contextual stimulus features (in this case, pitch and spatial frequency). These effects provide support for dedicated asynchrony mechanisms that interact with spatially selective mechanisms early in visual and auditory sensory pathways.